7 research outputs found

    Real-Time Monitoring and Fault Diagnostics in Roll-To-Roll Manufacturing Systems

    A roll-to-roll (R2R) process is a manufacturing technique in which a flexible substrate is processed continuously as it is transferred between rotating rolls. It integrates many additive and subtractive processing techniques to produce rolls of product efficiently and cost-effectively, owing to its high production rate and large output volume. R2R processes have therefore been increasingly adopted in a wide range of manufacturing industries, including traditional paper/fabric production, plastic and metal foil manufacturing, flexible electronics, thin-film batteries, photovoltaics, and graphene film production. However, the increasing complexity of R2R processes and high demands on product quality have heightened the need for effective real-time process monitoring and fault diagnosis in R2R manufacturing systems. This dissertation aims to develop tools that increase system visibility without additional sensors, in order to enhance real-time monitoring and fault diagnosis capability in R2R manufacturing systems.

    First, a multistage modeling method is proposed for process monitoring and quality estimation in R2R processes. Product-centric and process-centric variation propagation are introduced to characterize variation propagation throughout the system. The multistage model focuses mainly on the formulation of process-centric variation propagation, which is unique to R2R processes, and on the corresponding product quality measurements, combining physical knowledge with sensor data analysis.

    Second, a nonlinear analytical redundancy method is proposed for sensor validation, to ensure the accuracy of the sensor measurements used for process and quality control. Parity relations based on a nonlinear observation matrix are formulated to characterize the system dynamics and sensor measurements. A robust optimization is designed to identify parity-relation coefficients that can tolerate a certain level of measurement noise and system disturbance. The effect of changes in operating conditions on the optimal objective value (the parity residuals) and the optimal design variables (the parity coefficients) is evaluated with sensitivity analysis.

    Finally, a multiple-model approach for anomaly detection and fault diagnosis is introduced to improve diagnosability under different operating regimes. The growing structure multiple model system (GSMMS) is employed, which uses Voronoi sets to automatically partition the entire operating space into smaller operating regimes. The local model identification problem is reformulated as an optimization problem in the loss-minimization framework and solved with mini-batch stochastic gradient descent instead of least-squares algorithms. This revision expands the GSMMS method's capability to handle local model identification problems that have no closed-form solution.

    The effectiveness of the models and methods is demonstrated with testbed data from an R2R process. The results show that the proposed models and methods are effective tools for understanding variation propagation in R2R processes and improving the estimation accuracy of product quality by 70%, identifying the health status of sensors promptly to guarantee data accuracy for modeling and decision making, and reducing the false alarm rate while increasing detection power under different operating conditions. Ultimately, the tools developed in this thesis help increase the visibility of R2R manufacturing systems, improve productivity, and reduce the product rejection rate.

    PhD, Mechanical Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. https://deepblue.lib.umich.edu/bitstream/2027.42/146114/1/huanyis_1.pd
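    To make the GSMMS step above concrete, here is a minimal Python sketch of a multiple-model detector in that spirit: operating points are assigned to a regime by the nearest Voronoi centroid, each regime gets a local linear model fitted with mini-batch stochastic gradient descent rather than least squares, and a new sample is flagged when its residual exceeds that regime's threshold. The data, model form, learning rate, and 3-sigma threshold are all illustrative assumptions, not the dissertation's implementation.

```python
# Minimal sketch of a GSMMS-style multiple-model anomaly detector.
# Hypothetical illustration only: regime partitioning via nearest Voronoi
# centroid, local linear models fitted with mini-batch SGD, and residual
# thresholding per regime.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic operating data: inputs u (operating conditions) and outputs y.
u = rng.uniform(-1.0, 1.0, size=(2000, 2))
y = np.where(u[:, 0] > 0, 2.0 * u[:, 0] - u[:, 1], -u[:, 0] + 0.5 * u[:, 1])
y = y + 0.05 * rng.standard_normal(len(u))

# Voronoi-style partition of the operating space: nearest of K centroids.
K = 4
centroids = rng.uniform(-1.0, 1.0, size=(K, 2))
regime = np.argmin(((u[:, None, :] - centroids[None, :, :]) ** 2).sum(-1), axis=1)

def fit_local_model(X, t, epochs=200, batch=32, lr=0.05):
    """Fit t ~ X @ w + b with mini-batch SGD on a squared loss."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        idx = rng.permutation(len(X))
        for start in range(0, len(X), batch):
            sl = idx[start:start + batch]
            err = X[sl] @ w + b - t[sl]
            w -= lr * (X[sl].T @ err) / len(sl)
            b -= lr * err.mean()
    return w, b

# One local model and one residual threshold per regime.
models, thresholds = {}, {}
for k in range(K):
    mask = regime == k
    if mask.sum() < 10:
        continue
    w, b = fit_local_model(u[mask], y[mask])
    resid = np.abs(u[mask] @ w + b - y[mask])
    models[k] = (w, b)
    thresholds[k] = resid.mean() + 3 * resid.std()

def is_anomalous(u_new, y_new):
    """Flag a new sample whose residual exceeds its regime's threshold."""
    k = int(np.argmin(((u_new - centroids) ** 2).sum(-1)))
    if k not in models:
        return True  # no local model was identified for this regime
    w, b = models[k]
    return abs(u_new @ w + b - y_new) > thresholds[k]

print(is_anomalous(np.array([0.5, 0.2]), 0.8))  # likely consistent with its local model
print(is_anomalous(np.array([0.5, 0.2]), 5.0))  # large residual, flagged as anomalous
```

    The local models here happen to be linear, but nothing in the SGD formulation requires that; as the abstract notes, the point of replacing least squares with loss minimization is precisely to handle local models that have no closed-form solution.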

    Manufacturing productivity and energy efficiency: a stochastic efficiency frontier analysis

    Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/113674/1/er3368.pd

    Targeted collapse regularized autoencoder for anomaly detection: black hole at the center

    Autoencoders have been extensively used in the development of recent anomaly detection techniques. The premise of their application is based on the notion that after training the autoencoder on normal training data, anomalous inputs will exhibit a significant reconstruction error. Consequently, this enables a clear differentiation between normal and anomalous samples. In practice, however, it is observed that autoencoders can generalize beyond the normal class and achieve a small reconstruction error on some of the anomalous samples. To improve the performance, various techniques propose additional components and more sophisticated training procedures. In this work, we propose a remarkably straightforward alternative: instead of adding neural network components, involved computations, and cumbersome training, we complement the reconstruction loss with a computationally light term that regulates the norm of representations in the latent space. The simplicity of our approach minimizes the requirement for hyperparameter tuning and customization for new applications which, paired with its permissive data modality constraint, enhances the potential for successful adoption across a broad range of applications. We test the method on various visual and tabular benchmarks and demonstrate that the technique matches and frequently outperforms alternatives. We also provide a theoretical analysis and numerical simulations that help demonstrate the underlying process that unfolds during training and how it can help with anomaly detection. This mitigates the black-box nature of autoencoder-based anomaly detection algorithms and offers an avenue for further investigation of advantages, fail cases, and potential new directions.

    Comment: 16 pages, 4 figures, 4 tables
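    As a rough illustration of the core idea, the PyTorch sketch below complements a standard autoencoder's reconstruction loss with a light penalty on the squared norm of the latent representations and then scores samples by reconstruction error. The architecture, penalty weight, and synthetic data are assumptions made for illustration; the paper's exact regularization term and training details may differ.

```python
# Minimal sketch: autoencoder whose training loss combines reconstruction
# error with a light penalty on the norm of latent representations.
# Architecture, penalty weight, and data are illustrative assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)

class AE(nn.Module):
    def __init__(self, in_dim=20, latent_dim=4):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 16), nn.ReLU(),
                                     nn.Linear(16, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 16), nn.ReLU(),
                                     nn.Linear(16, in_dim))

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

model = AE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
lam = 0.1  # weight of the latent-norm term (illustrative value)

normal_data = torch.randn(512, 20) * 0.5  # stand-in for the normal class

for epoch in range(200):
    recon, z = model(normal_data)
    recon_loss = ((recon - normal_data) ** 2).mean()
    latent_norm = (z ** 2).sum(dim=1).mean()   # regulates latent-space norms
    loss = recon_loss + lam * latent_norm
    opt.zero_grad()
    loss.backward()
    opt.step()

# Score new samples by reconstruction error; higher error = more anomalous.
with torch.no_grad():
    normal_score = ((model(normal_data[:1])[0] - normal_data[:1]) ** 2).mean()
    anomaly = torch.randn(1, 20) * 3.0 + 5.0   # out-of-distribution sample
    anomaly_score = ((model(anomaly)[0] - anomaly) ** 2).mean()
print(float(normal_score), float(anomaly_score))
```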

    A Novel Two-level Causal Inference Framework for On-road Vehicle Quality Issues Diagnosis

    In the automotive industry, the full cycle of managing in-use vehicle quality issues can take weeks to investigate. The process involves isolating root causes, defining and implementing appropriate treatments, and refining treatments if needed. The main pain point is the lack of a systematic method to identify causal relationships, evaluate treatment effectiveness, and direct the next actionable treatment if the current treatment was deemed ineffective. This paper will show how we leverage causal Machine Learning (ML) to speed up such processes. A real-world data set collected from on-road vehicles will be used to demonstrate the proposed framework. Open challenges for vehicle quality applications will also be discussed.

    Comment: Accepted by NeurIPS 2022 Workshop on Causal Machine Learning for Real-World Impact (CML4Impact 2022)
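    The abstract does not spell out the two-level framework itself, so the sketch below shows only a generic causal ML building block of the kind such a pipeline relies on: a simple T-learner that estimates treatment effectiveness from observational data by fitting one outcome model per treatment arm and taking the difference of their predictions. The synthetic data, feature meanings, and model choice are hypothetical, not taken from the paper.

```python
# Generic T-learner sketch for treatment-effect estimation on synthetic data.
# Illustrative only; not the paper's two-level causal inference framework.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Synthetic observational data: X = vehicle/usage features, t = whether a
# treatment (e.g. a software fix) was applied, y = quality metric afterwards.
n = 4000
X = rng.normal(size=(n, 5))
propensity = 1 / (1 + np.exp(-X[:, 0]))    # treatment assignment depends on X
t = rng.binomial(1, propensity)
true_effect = 1.0 + 0.5 * X[:, 1]          # heterogeneous benefit of treatment
y = X[:, 0] + true_effect * t + rng.normal(scale=0.5, size=n)

# T-learner: one outcome model per treatment arm.
m_treated = GradientBoostingRegressor().fit(X[t == 1], y[t == 1])
m_control = GradientBoostingRegressor().fit(X[t == 0], y[t == 0])

# Estimated individual treatment effects and the average effect.
cate = m_treated.predict(X) - m_control.predict(X)
print("estimated ATE:", cate.mean(), "true ATE:", true_effect.mean())
```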

    Twofold Variation Propagation Modeling and Analysis for Roll-to-Roll Manufacturing Systems

    No full text

    Correction to: Comparative effectiveness and safety of non-vitamin K antagonists for atrial fibrillation in clinical practice: GLORIA-AF Registry

    No full text
    In this article, the name of the GLORIA-AF investigator Anastasios Kollias was given incorrectly as Athanasios Kollias in the Acknowledgements. The original article has been corrected.

    Patterns of oral anticoagulant use and outcomes in Asian patients with atrial fibrillation: a post-hoc analysis from the GLORIA-AF Registry

    Background: Previous studies suggested potential ethnic differences in the management and outcomes of atrial fibrillation (AF). We aim to analyse oral anticoagulant (OAC) prescription, discontinuation, and risk of adverse outcomes in Asian patients with AF, using data from a global prospective cohort study.

    Methods: From the GLORIA-AF Registry Phase II-III (November 2011-December 2014 for Phase II, and January 2014-December 2016 for Phase III), we analysed patients according to their self-reported ethnicity (Asian vs. non-Asian), as well as according to Asian subgroups (Chinese, Japanese, Korean and other Asian). Logistic regression was used to analyse OAC prescription, while the risks of OAC discontinuation and adverse outcomes were analysed with Cox regression models. Our primary outcome was the composite of all-cause death and major adverse cardiovascular events (MACE). The original studies were registered with ClinicalTrials.gov, NCT01468701, NCT01671007, and NCT01937377.

    Findings: 34,421 patients were included (70.0 ± 10.5 years, 45.1% females, 6900 (20.0%) Asian: 3829 (55.5%) Chinese, 814 (11.8%) Japanese, 1964 (28.5%) Korean and 293 (4.2%) other Asian). Most of the Asian patients were recruited in Asia (n = 6701, 97.1%), while non-Asian patients were mainly recruited in Europe (n = 15,449, 56.1%) and North America (n = 8378, 30.4%). Compared to non-Asian individuals, prescription of OAC and non-vitamin K antagonist oral anticoagulant (NOAC) was lower in Asian patients (Odds Ratio [OR] and 95% Confidence Interval [CI]: 0.23 [0.22-0.25] and 0.66 [0.61-0.71], respectively), but higher in the Japanese subgroup. Asian ethnicity was also associated with a higher risk of OAC discontinuation (Hazard Ratio [HR] and [95% CI]: 1.79 [1.67-1.92]) and a lower risk of the primary composite outcome (HR [95% CI]: 0.86 [0.76-0.96]). Among the exploratory secondary outcomes, Asian ethnicity was associated with higher risks of thromboembolism and intracranial haemorrhage, and a lower risk of major bleeding.

    Interpretation: Our results showed that Asian patients with AF had suboptimal thromboembolic risk management and a specific risk profile of adverse outcomes; these differences may also reflect differences in country-specific factors. Ensuring integrated and appropriate treatment of these patients is crucial to improve their prognosis.

    Funding: The GLORIA-AF Registry was funded by Boehringer Ingelheim GmbH.
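    For readers less familiar with the two model families named in the Methods, the sketch below shows how a logistic regression for OAC prescription and a Cox proportional hazards model for a time-to-event outcome are commonly set up in Python (statsmodels and lifelines). The synthetic data and column names are hypothetical and do not reproduce the registry analysis or its results.

```python
# Illustrative sketch of the two model families mentioned in the abstract:
# logistic regression (OAC prescription) and Cox regression (time-to-event).
# Synthetic data and column names are hypothetical, not the GLORIA-AF data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "asian": rng.binomial(1, 0.2, n),     # 1 = Asian, 0 = non-Asian
    "age": rng.normal(70, 10, n),
    "female": rng.binomial(1, 0.45, n),
})
# Synthetic prescription indicator and follow-up outcome.
logit_p = -0.5 - 1.0 * df["asian"] + 0.01 * (df["age"] - 70)
df["oac_prescribed"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))
df["time"] = rng.exponential(scale=3.0, size=n)   # years of follow-up
df["event"] = rng.binomial(1, 0.15, n)            # composite outcome occurred

# Logistic regression for OAC prescription.
logit_model = smf.logit("oac_prescribed ~ asian + age + female", data=df).fit(disp=0)
print(logit_model.params)

# Cox proportional hazards model for the time-to-event outcome.
cph = CoxPHFitter()
cph.fit(df[["time", "event", "asian", "age", "female"]],
        duration_col="time", event_col="event")
cph.print_summary()
```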